
    REVIEW ARTICLE Mycobacterium bovis (bovine tuberculosis) infection in North American wildlife: current status and opportunities for mitigation of risks of further infection in wildlife populations

    Mycobacterium bovis (M. bovis), the causative agent of bovine tuberculosis, has been identified in nine geographically distinct wildlife populations in North America and Hawaii and is endemic in at least three populations, including members of the Bovidae, Cervidae, and Suidae families. The emergence of M. bovis in North American wildlife poses a serious and growing risk for livestock and human health and for the recreational hunting industry. Experience in many countries, including the USA and Canada, has shown that while M. bovis can be controlled when restricted to livestock species, it is almost impossible to eradicate once it has spread into ecosystems with free-ranging maintenance hosts. Therefore, preventing transmission of M. bovis to wildlife may be the most effective way to mitigate the economic and health costs of this bacterial pathogen. Here we review the status of M. bovis infection in wildlife of North America and identify risks for its establishment in uninfected North American wildlife populations, where eradication or control would be difficult and costly. We identified four common risk factors associated with establishment of M. bovis in uninfected wildlife populations in North America: (1) commingling of infected cattle with susceptible wildlife, (2) supplemental feeding of wildlife, (3) inadequate surveillance of at-risk wildlife, and (4) unrecognized emergence of alternate wildlife species as successful maintenance hosts. We then propose the use of integrated and adaptive disease management to mitigate these risk factors and prevent the establishment of M. bovis in susceptible North American wildlife species.

    Modification of Blood by Zeolites for Transfusion Purposes

    Richard Lower made the first blood transfusion in 1665, but the wide use of the technique in medicine is quite recent. One of the difficulties lay in the coagulation of the blood during the transfer. Accordingly, the blood is generally treated with sodium citrate to prevent this. It is known that calcium in the blood plays a role in coagulation and probably in agglutination. It was therefore decided to study blood from which the calcium had been removed. To remove the calcium we adopted the principle used in removing calcium from hard water, that is, passing it over a zeolite. Blood is viscous, especially when coagulated, so it was necessary to use suction to draw it through the tube filled with zeolite. The tube was attached to a filter flask, which in turn was attached to a filter pump. In this way the blood was passed through the zeolite bed successfully.

    Measured performance of the new University of California gamma ray telescope

    The design of the new medium energy balloon-borne gamma ray telescope is discussed. This telescope is sensitive to 1-30 MeV gamma rays. The results of the initial calibration are described. The position and energy resolutions of 32 plastic and NaI(Tl) scintillator bars, each 100 cm long, are discussed. The telescope's measured angular and energy resolutions as a function of incident angle are compared with detailed Monte Carlo calculations at 1.37, 2.75 and 6.13 MeV. The expected resolutions are 5 deg FWHM and 8% at 2.75 MeV. The expected area-efficiency is 250 cm².
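
    The quoted figures can be put in perspective with the standard Gaussian line-shape relation FWHM = 2*sqrt(2 ln 2)*sigma (about 2.355*sigma). The following is a minimal back-of-the-envelope sketch, not taken from the paper: it assumes the fractional energy resolution of a scintillator scales roughly as 1/sqrt(E), as it would if photoelectron statistics dominate, and anchors that assumed scaling to the quoted 8% FWHM at 2.75 MeV; the function name and scaling law are illustrative assumptions.

        import numpy as np

        def fwhm_fraction(e_mev, ref_frac=0.08, ref_e=2.75):
            """FWHM/E under an assumed 1/sqrt(E) statistical scaling,
            anchored to the quoted 8% FWHM at 2.75 MeV."""
            return ref_frac * np.sqrt(ref_e / e_mev)

        for e in (1.37, 2.75, 6.13):              # calibration lines from the abstract
            frac = fwhm_fraction(e)
            sigma_kev = frac * e * 1000 / 2.355   # Gaussian sigma in keV
            print(f"{e:5.2f} MeV: {100*frac:4.1f}% FWHM, sigma ~ {sigma_kev:.0f} keV")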

    Engaging Empirical Dynamic Modeling to Detect Intrusions in Cyber-Physical Systems

    Modern cyber-physical systems require effective intrusion detection systems to ensure adequate critical infrastructure protection. Developing an intrusion detection capability requires an understanding of the behavior of a cyber-physical system and causality of its components. Such an understanding enables the characterization of normal behavior and the identification and reporting of anomalous behavior. This chapter explores a relatively new time series analysis technique, empirical dynamic modeling, that can contribute to system understanding. Specifically, it examines if the technique can adequately describe causality in cyber-physical systems and provides insights into it serving as a foundation for intrusion detection
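
    Empirical dynamic modeling builds on Takens time-delay embedding of a scalar time series and nearest-neighbour (simplex projection) forecasting; large residuals between forecast and observed sensor values can then be flagged as anomalies. The sketch below is a minimal, self-contained NumPy illustration of that core machinery, not the chapter's implementation; the logistic-map signal, the parameter choices (E, tau, tp) and the exponential neighbour weighting follow the standard simplex-projection recipe but are assumptions here.

        import numpy as np

        def delay_embed(x, E, tau=1):
            # Takens embedding: row t is (x_t, x_{t-tau}, ..., x_{t-(E-1)tau}).
            n = len(x) - (E - 1) * tau
            return np.column_stack([x[(E - 1 - i) * tau : (E - 1 - i) * tau + n]
                                    for i in range(E)])

        def simplex_forecast(x, E=3, tau=1, tp=1, lib_frac=0.5):
            # Forecast x_{t+tp} by simplex projection; returns (predicted, observed).
            x = np.asarray(x, dtype=float)
            M = delay_embed(x, E, tau)
            t = np.arange(len(M)) + (E - 1) * tau   # time index of each row
            keep = t + tp < len(x)                  # rows whose target exists
            M, t = M[keep], t[keep]
            split = int(lib_frac * len(M))
            lib, lib_y = M[:split], x[t[:split] + tp]   # "library" half
            preds = []
            for v in M[split:]:                     # forecast the held-out half
                d = np.linalg.norm(lib - v, axis=1)
                nn = np.argsort(d)[:E + 1]          # E+1 nearest neighbours
                w = np.exp(-d[nn] / max(d[nn[0]], 1e-12))   # simplex weights
                preds.append(np.dot(w, lib_y[nn]) / w.sum())
            return np.array(preds), x[t[split:] + tp]

        # Demo on a chaotic logistic map standing in for a sensor trace:
        x = [0.4]
        for _ in range(999):
            x.append(3.9 * x[-1] * (1.0 - x[-1]))
        pred, obs = simplex_forecast(x)
        print("forecast skill (Pearson r):", np.corrcoef(pred, obs)[0, 1])

    Convergent cross mapping, the EDM tool usually used to test causality between two system variables, builds on exactly this embedding-and-neighbour machinery, so the same components can serve both the causality analysis and the anomaly-scoring roles described above.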

    A review of mammographic positioning image quality criteria for the craniocaudal projection

    Detection of breast cancer is reliant on optimal breast positioning and the production of quality images. Two projections, the mediolateral oblique (MLO) and craniocaudal (CC), are routinely performed. Determination of successful positioning and inclusion of all breast tissue is achieved through meeting stated image quality criteria. For the CC view, current image quality criteria are inconsistent. The absence of reliable anatomical markers, other than the nipple, further contributes to difficulties in assessing the quality of CC views. The aim of this paper was to explore published international quality standards to identify and trace the origin of any CC positioning criteria that might allow quantitative assessment. The pectoralis major (pectoral) muscle was identified as the key posterior anatomical structure for establishing optimal breast tissue inclusion on mammographic projections. It underpins the first two of the three main CC metrics that are frequently reported: (1) visualisation of the pectoral muscle, (2) measurement of the posterior nipple line (PNL), and (3) depiction of retroglandular fat. This literature review explores the origin of the three metrics and discusses three key publications, spanning 1992 to 1994, on which subsequent image quality standards have been based. The evidence base to support published CC metrics is often not specified; more often the same set of publications is cited, usually without critical evaluation. To conclude, it remains uncertain whether the metrics explored for the CC view support objective, reproducible evaluation to confirm optimal breast positioning and quality images.
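
    Of the three metrics, the posterior nipple line lends itself most directly to a quantitative check. A commonly cited audit rule, assumed here purely for illustration (the review itself questions the evidence base for such criteria), is that the PNL measured on the CC view should fall within about 1 cm of the PNL measured on the MLO view of the same breast. A minimal sketch:

        def pnl_cc_adequate(pnl_cc_cm: float, pnl_mlo_cm: float,
                            tolerance_cm: float = 1.0) -> bool:
            """True if the CC view's PNL is within `tolerance_cm` of the MLO PNL.
            The 1 cm default is the commonly cited rule of thumb, an assumption."""
            return pnl_mlo_cm - pnl_cc_cm <= tolerance_cm

        # Hypothetical measurements for one examination:
        print(pnl_cc_adequate(pnl_cc_cm=9.2, pnl_mlo_cm=9.8))   # True: within 1 cm
        print(pnl_cc_adequate(pnl_cc_cm=8.1, pnl_mlo_cm=9.8))   # False: >1 cm short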

    On the Complexity of t-Closeness Anonymization and Related Problems

    An important issue in releasing individual data is to protect the sensitive information from being leaked and maliciously utilized. Famous privacy preserving principles that aim to ensure both data privacy and data integrity, such as k-anonymity and l-diversity, have been extensively studied both theoretically and empirically. Nonetheless, these widely-adopted principles are still insufficient to prevent attribute disclosure if the attacker has partial knowledge about the overall sensitive data distribution. The t-closeness principle has been proposed to fix this, and it also has the benefit of supporting numerical sensitive attributes. However, in contrast to k-anonymity and l-diversity, the theoretical aspects of t-closeness have not been well investigated. We initiate the first systematic theoretical study on the t-closeness principle under the commonly-used attribute suppression model. We prove that for every constant t with 0 ≤ t < 1, it is NP-hard to find an optimal t-closeness generalization of a given table. The proof consists of several reductions, each of which works for a different range of values of t, which together cover the full range. To complement this negative result, we also provide exact and fixed-parameter algorithms. Finally, we answer some open questions regarding the complexity of k-anonymity and l-diversity left in the literature. Comment: An extended abstract to appear in DASFAA 201
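
    Concretely, a table satisfies t-closeness if, in every equivalence class, the distribution of the sensitive attribute lies within distance t of its distribution in the whole table, with distance measured by the Earth Mover's Distance. For an ordered attribute with unit ground distance between adjacent values, the EMD reduces to the average of the absolute cumulative differences between the two distributions. A minimal sketch of that check (the salary domain and the equivalence class below are hypothetical illustrations):

        from collections import Counter

        def ordered_emd(cls_vals, table_vals, domain):
            # EMD between the class's and the table's sensitive-value distributions
            # over an ordered domain with unit distance between adjacent values:
            # EMD = (1/(m-1)) * sum_i |cumulative difference up to position i|.
            m = len(domain)
            p, q = Counter(cls_vals), Counter(table_vals)
            P = [p[v] / len(cls_vals) for v in domain]
            Q = [q[v] / len(table_vals) for v in domain]
            emd = cum = 0.0
            for i in range(m - 1):
                cum += P[i] - Q[i]
                emd += abs(cum)
            return emd / (m - 1)

        # Hypothetical example: a low-salary equivalence class against a table
        # that is uniform over nine ordered salary values; prints 0.375, so
        # this class violates t-closeness for any t < 0.375.
        domain = [3000, 4000, 5000, 6000, 7000, 8000, 9000, 10000, 11000]
        print(ordered_emd([3000, 4000, 5000], domain, domain))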